  • The Australian National Gravity Database (ANGD) contains over 1.8 million gravity observations from more than 2,000 surveys conducted in Australia over the last 80 years. Three processes are required to correct these observations for the effects of the surrounding topography: firstly, a Bouguer correction (Bullard A), which approximates the topography as an infinite horizontal slab; secondly, a correction to that horizontal slab for the curvature of the Earth (Bullard B); and thirdly, a terrain correction (Bullard C), which accounts for the undulations in the surrounding topography. Together these three corrections produce complete Bouguer anomalies. Since February 2008, a spherical cap Bouguer anomaly calculation, which applies the Bullard A and Bullard B corrections, has been applied to data extracted from the ANGD. Terrain corrections (Bullard C) have now been calculated for all terrestrial gravity observations in the ANGD, allowing the calculation of complete Bouguer anomalies. These terrain corrections were calculated using the Shuttle Radar Topography Mission 3 arc-second digital elevation data. The complete Bouguer anomalies calculated for the ANGD give users of the data a more accurate representation of crustal density variations, because a more realistic Earth model is applied to the gravity observations.
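As an illustration of the first of these corrections, the sketch below evaluates the Bullard A (infinite slab) term, g = 2πGρh, using the conventional crustal reduction density of 2670 kg/m³. It is a minimal rendering of the standard textbook formula, not the ANGD production code; the function name is our own.

```python
import math

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
RHO = 2670.0     # conventional crustal reduction density, kg/m^3

def bullard_a_mgal(height_m: float, density: float = RHO) -> float:
    """Bullard A (infinite horizontal slab) correction in milligals.

    g = 2 * pi * G * rho * h, converted from m/s^2 to mGal (1 mGal = 1e-5 m/s^2).
    """
    return 2.0 * math.pi * G * density * height_m / 1e-5

# A station 500 m above the reference level:
print(f"{bullard_a_mgal(500.0):.2f} mGal")   # ~55.98 mGal
```

At this density the slab term works out to roughly 0.112 mGal per metre of elevation, the figure commonly quoted for Bouguer reductions.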

  • In this age of state-of-the-art devices producing analytical results with little input from analytical specialists, how do we know that the results produced are correct? When reporting the result of a measurement of a physical quantity, it is obligatory that some quantitative indication of the quality of the result be given, so that those who use it can assess its reliability. Without such an indication, measurement results cannot be compared, either among themselves or with reference values given in a specification or standard. It is therefore necessary that there be a readily implemented, easily understood, and generally accepted procedure for characterising the quality of a result of a measurement, that is, for evaluating and expressing its 'uncertainty'. The concept of 'uncertainty' as a quantifiable attribute is relatively new in the history of measurement, although error and error analysis have long been part of the practice of measurement science, or 'metrology'. It is now widely recognised that, when all of the known or suspected components of error have been evaluated and the appropriate corrections have been applied, there still remains an uncertainty about the correctness of the stated result, that is, a doubt about how well the result of the measurement represents the value of the quantity being measured. This presentation will discuss the latest practices for the production of 'reliable' geochemical data that are associated with small measurement uncertainties, and will provide an overview of the current understanding of metrological traceability and the proper use of reference materials. The correct use of reference materials will be discussed, as well as the role of measurement uncertainty and how it is affected by such issues as sample preparation, sample heterogeneity and data acquisition.
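For instance, under the widely used GUM approach, independent uncertainty components such as those named above combine in quadrature into a combined standard uncertainty, which is then scaled by a coverage factor. The sketch below is a generic illustration with invented numbers, not data from the presentation.

```python
import math

def combined_standard_uncertainty(components: list[float]) -> float:
    """Root-sum-of-squares of independent standard uncertainties (GUM law of propagation)."""
    return math.sqrt(sum(u * u for u in components))

# Invented, purely illustrative budget for one analyte, in % relative:
u_prep = 0.8     # sample preparation
u_het = 1.5      # sample heterogeneity
u_acq = 0.6      # instrumental data acquisition

u_c = combined_standard_uncertainty([u_prep, u_het, u_acq])
U = 2.0 * u_c    # expanded uncertainty, coverage factor k = 2 (~95 % confidence)
print(f"u_c = {u_c:.2f} %, U(k=2) = {U:.2f} %")   # u_c = 1.80 %, U(k=2) = 3.61 %
```

Note how the largest component (heterogeneity here) dominates the combined value, which is why sample preparation and heterogeneity deserve as much scrutiny as the instrument itself.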

  • We propose an automated capture system that follows the fundamental scientific methodology. It starts with the instrument that captures the data, uses web services to make standardised data reduction programs more widely accessible, and finally uses internationally agreed data transfer standards to make geochemical data seamlessly accessible online from a series of internationally distributed certified repositories. The Australian National Data Service (ANDS, http://www.ands.org.au/) is funding a range of data capture solutions to ensure that the data creation and data capture phases of research are fully integrated, enabling effective ingestion into research data and metadata stores at the institution or elsewhere. ANDS is also developing a national discovery service that enables access, with rich context, to data held in institutional stores. No data is stored in this system, only metadata with pointers back to the original data. This lets researchers keep their own data while enabling access to many repositories at once. Such a system will require standardisation at all phases of the process of analytical geochemistry. The geochemistry community needs to work together to develop standards for the attributes collected from the instrument, to develop more standardised processing of the raw data, and to agree on what is required for publishing. An online collaborative workspace such as this would be ideal for geochemical data, and the provision of standardised, open-source software would greatly enhance the persistence of individual geochemistry data collections and facilitate reuse and repurposing. This conforms to the guidelines from Geoinformatics for Geochemistry (http://www.geoinfogeochem.org/), which require metadata on how the samples were analysed.
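A minimal sketch of the metadata-only pattern described above: the discovery service holds descriptive attributes plus a pointer back to the source repository, never the data itself. The record fields and URL here are hypothetical and are not the ANDS schema.

```python
from dataclasses import dataclass

@dataclass
class MetadataRecord:
    """A discovery-service entry: descriptive attributes plus a pointer, no data."""
    title: str
    institution: str
    instrument: str         # attribute captured at analysis time
    analysis_method: str    # how the samples were analysed
    data_url: str           # pointer back to the original data in its home repository

record = MetadataRecord(
    title="Example zircon U-Pb analyses",
    institution="Example University",
    instrument="Example mass spectrometer",
    analysis_method="U-Pb geochronology",
    data_url="https://repository.example.edu/datasets/12345",
)

# A harvester indexes records like this centrally; the data itself stays put.
print(record.title, "->", record.data_url)
```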

  • It is impractical for a single agency in Australia to hold responsibility for maintaining a national landslide database. Geoscience Australia has successfully demonstrated the benefits of adopting information management strategies as one solution for bringing local, regional and national scale landslide data together. In the first application of networked, service-oriented interoperability to a natural hazards domain, Australia now has an up-to-date central landslide database that makes full use of diverse data across three levels of government. The approach is centred on a 'common data model' that addresses aspects of landslides captured by different agencies. The methodology brings four distinct components together: a landslide application schema; a landslide domain model; web service implementations; and a user interface. Sharing and exchanging data more efficiently through an interoperable approach ensures that full value is made of available information, and that responsibility for collecting and maintaining this data is shared across all agencies. Specific-purpose data not only continues to serve the needs of individual database custodians, but now also serves a broader need. Such a system establishes the foundation for a very powerful and coordinated information resource in Australia through its ability to collate and characterise large volumes of information, and provides a suitable basis for greater investment in data collection. At a minimum, the pilot project provides Australia with a framework for a centralised national landslide inventory, which can connect other available landslide databases. There is also considerable capacity for this approach to provide State Governments with a simple way to compile and maintain their own state-wide databases, to extend the approach across other natural hazard databases, and to integrate data from other domains.
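As a rough illustration of the 'common data model' idea, the sketch below maps a custodian's local record onto a shared structure. All field names are hypothetical and are not drawn from the published landslide application schema; the point is that each agency keeps its own database and only the mapping needs to be agreed nationally.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LandslideEvent:
    """Shared record exchanged between agencies (field names hypothetical)."""
    event_id: str
    latitude: float
    longitude: float
    date_observed: str              # ISO 8601 date string
    movement_type: Optional[str]    # e.g. 'fall', 'slide', 'flow'
    custodian: str                  # agency holding the authoritative source record

def to_common_model(local_row: dict, custodian: str) -> LandslideEvent:
    """Map one custodian's local schema onto the shared model.

    Only this mapping, exposed through a web service, is agreed nationally;
    the custodian's own database is untouched.
    """
    return LandslideEvent(
        event_id=f"{custodian}:{local_row['id']}",
        latitude=float(local_row["lat"]),
        longitude=float(local_row["lon"]),
        date_observed=local_row["date"],
        movement_type=local_row.get("type"),
        custodian=custodian,
    )

# Example: one agency's row mapped into the shared inventory.
row = {"id": "42", "lat": -42.88, "lon": 147.33, "date": "1996-07-15", "type": "slide"}
print(to_common_model(row, custodian="EXAMPLE-AGENCY"))
```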

  • Proceedings of the Second National Forum on GIS in the Geosciences, 29 - 31 March 1995, held at the National Library of Australia.

  • Part-page item on matters relating to stratigraphic nomenclature and the Australian Stratigraphic Units Database (ASUD). This column (No. 59) discusses names that do not meet the recommendations of the current International Stratigraphic Guide, and why they are nevertheless retained in the ASUD. ISSN 0312-4711

  • Quarterly column on issues in Australian stratigraphy

  • Marine science is expensive, and duplication of research activities is potentially money wasted. Being unaware of other marine science studies can also call into question the validity of findings made in single-discipline studies. A simple means of discovery is needed. The development of Earth browsers (principally Google Earth) and KML (Keyhole Markup Language) files offers a possible solution. Google Earth is easy to use, and KML files are relatively simple ASCII, XML-tagged files that can encode locations (points, lines and polygons), relevant metadata for presentation in descriptive 'balloons', and links to digital sources (data, publications, web pages, etc.). A suite of studies will be presented showing how information relating to investigations at a point (e.g. an observation platform), along a line (e.g. a ship-borne survey) or over a region (e.g. satellite imagery) can be encoded in a small (10 kbyte) file. The information will cover a range of widely used data types, including seismic data, underwater video, image files, documents and spreadsheets. All will be sourced directly from the web and can be downloaded from within the browser to one's desktop for analysis with appropriate applications. To be useful, this methodology requires data and metadata to be properly managed, and a degree of cooperation between major marine science organisations, which could become 'sponsors' of the principal marine science disciplines (i.e. oceanography, marine biology, geoscience). This need not be a complex task in many cases. The partitioning of the sciences is not important, so long as the information is managed effectively and its existence is widely advertised. KML files provide a simple way of achieving this. The various discipline-based KML files could be hosted by an umbrella organisation such as the AODCJF, enabling it to become a 'one-stop shop' for marine science data.
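To give a concrete sense of how little machinery is involved, the sketch below writes a single-point placemark whose description balloon carries metadata and a link back to the source data. All names and URLs are placeholders, not real resources.

```python
# Build a minimal KML placemark for one marine observation point; the
# description 'balloon' carries metadata and a link back to the source data.
KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{name}</name>
    <description><![CDATA[
      Survey: {survey}<br/>
      Data: <a href="{url}">download</a>
    ]]></description>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>
</kml>
"""

kml = KML_TEMPLATE.format(
    name="Observation platform X",
    survey="Example single-point study",
    url="https://data.example.org/platform-x.csv",
    lon=151.20,
    lat=-33.90,
)

with open("platform_x.kml", "w", encoding="utf-8") as fh:
    fh.write(kml)   # the resulting file opens directly in Google Earth
```

Line and polygon features follow the same pattern with LineString or Polygon elements in place of Point, so a whole survey can still fit comfortably in a few kilobytes.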

  • The AGSO Web server now has a page that allows public access to many of AGSO's Oracle database lookup tables. These tables are the key to the nomenclature and classifications used in our geoscientific databases, and provide a valuable resource for many Australian geologists. For example, the geological time scale table provides a comprehensive list of time terms used in Australia and elsewhere, with their rank, scope, parent term, and older and younger age boundaries in millions of years, according to the latest information. The OZMIN mineral deposits attributes table, with nearly 2000 terms, provides a complete and authoritative classification of Australian ore deposits, as well as other attributes such as alteration, mineralisation style, gangue minerals, ore texture and relationships to host. With nearly 4500 terms, the largest of the 37 tables so far included is the extent-names table for our metadata system; the smallest, with just 9 terms, is the analyte categories table for the GWATER database. The tables may be downloaded from the Web, or alternatively purchased as ASCII files (AGSO Catalog No. 24488).

  • Geoscience data standards as a field of research may come as a surprise to many geoscientists, who probably think of it as a dull, peripheral issue of little relevance to their domain. However, the subject is gaining rapidly in importance as the information revolution begins to take hold: ultimately, billions of dollars' worth of information are at stake. In this article we take a look at what has happened recently in this field, where we think it is heading, and AGSO's role in national geoscience standards.